On the Discrepancy Function in Arbitrary Dimension, Close to L¹
Abstract
Our subject is irregularities of distribution of points with respect to rectangles in the unit cube. A familiar theme of the subject is to show that no matter how N points are selected, they must be far from uniform. We give a new proof of a well-known theorem in the subject (Halász, 1981), concerning the L¹ norm of the Discrepancy function, and show that this result admits an extension to arbitrary dimension. We also make some remarks on the Discrepancy function and Hardy space.
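For orientation, the central object can be stated as follows (a standard formulation, not taken verbatim from the paper): for a point set $\mathcal{P} \subset [0,1]^d$ with $|\mathcal{P}| = N$, the Discrepancy function compares the number of points in an anchored box with its expected count,

$$
D_N(x) \;=\; \#\bigl(\mathcal{P} \cap [0,x)\bigr) \;-\; N\, x_1 x_2 \cdots x_d,
\qquad x = (x_1,\dots,x_d) \in [0,1]^d .
$$

Halász's theorem in dimension $d = 2$ is the lower bound

$$
\| D_N \|_{L^1([0,1]^2)} \;\gtrsim\; \sqrt{\log N},
$$

valid for every choice of the $N$ points; the abstract above concerns a new proof of this bound and its extension to arbitrary dimension.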
Similar resources
Caratheodory dimension for observers
In this essay we introduce and study the notion of dimension for observers via Caratheodory structures and relative probability measures. We show that the dimension, as a function of three variables, is increasing on observers and decreasing on the cuts of an observer. We find observers with arbitrary non-negative dimensions. We show that the Caratheodory dimension for obs...
Full text

Casimir effects of nano objects in fluctuating scalar and electromagnetic fields: A thermodynamic investigation
Casimir entropy is an important aspect of the Casimir effect and is visible at the nanoscale. In this paper, we employ the path integral method to obtain a general relation for the Casimir entropy and internal energy of arbitrarily shaped objects in the presence of two-, three- and four-dimensional scalar fields and the electromagnetic field. For this purpose, using a Lagrangian and based on a perturb...
Full text

Discrepancy, Chaining and Subgaussian Processes
We show that for a typical coordinate projection of a subgaussian class of functions, the infimum over signs $\inf_{(\varepsilon_i)} \sup_{f \in F} \bigl| \sum_{i=1}^{k} \varepsilon_i f(X_i) \bigr|$ is asymptotically smaller than the expectation over signs, as a function of the dimension k, if the canonical Gaussian process indexed by F is continuous. To that end, we establish a bound on the discrepancy of an arbitrary subset of R^k using properties of ...
Full text

Classification of two-class data using hyper-rectangles parallel to the coordinate axes
One of the machine learning tasks is supervised learning. In supervised learning we infer a function from labeled training data. The goal of supervised learning algorithms is to learn a good hypothesis that minimizes the sum of the errors. A wide range of supervised algorithms is available, such as decision trees, SVM, and KNN methods. In this paper we focus on decision tree algorithms. When we ...
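The axis-parallel hyper-rectangle idea in the title can be illustrated with a minimal sketch (a hypothetical toy classifier, not the paper's algorithm): fit the tightest bounding box around the positive training points, then label a query point positive exactly when it lies inside that box.

```python
# Toy axis-parallel hyper-rectangle classifier (illustrative sketch only).

def fit_box(points):
    """Return (lows, highs): the per-coordinate min/max of the points,
    i.e. the tightest axis-parallel box containing them."""
    dims = range(len(points[0]))
    lows = [min(p[d] for p in points) for d in dims]
    highs = [max(p[d] for p in points) for d in dims]
    return lows, highs

def inside(box, x):
    """True iff x lies in the closed axis-parallel box."""
    lows, highs = box
    return all(lo <= xi <= hi for lo, xi, hi in zip(lows, x, highs))

# Toy 2-D example: positive examples cluster inside a rectangle.
positives = [(0.1, 0.2), (0.9, 0.8), (0.5, 0.5)]
box = fit_box(positives)
print(inside(box, (0.4, 0.6)))   # point inside the learned rectangle -> True
print(inside(box, (2.0, 0.5)))   # point outside in the first coordinate -> False
```

Each leaf of a decision tree with axis-aligned threshold splits corresponds to exactly such a hyper-rectangle, which is the connection to the decision-tree algorithms the abstract discusses.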
Full text

Ideal of lattice homomorphisms corresponding to the products of two arbitrary lattices and the lattice [2]
Let L and M be two finite lattices. The ideal J(L,M) is a monomial ideal in a specific polynomial ring whose minimal monomial generators correspond to the lattice homomorphisms ϕ: L→M. This ideal is called the ideal of lattice homomorphisms. In this paper, we study J(L,M) in the case that L is the product of two lattices L_1 and L_2 and M is the chain [2]. We first characterize the set...
Full text